Neural computing with coherent laser networks
Authors
Abstract
We show that coherent laser networks (CLNs) exhibit emergent neural computing capabilities. The proposed scheme is built on harnessing the collective behavior of coupled lasers for storing a number of phase patterns as stable fixed points of the governing dynamical equations and retrieving such patterns through proper excitation conditions, thus exhibiting an associative memory property. It is discussed that, despite the large storage capacity of the network, the overlap between fixed-point patterns effectively limits pattern retrieval to only two images. Next, we show that this restriction can be lifted by using nonreciprocal coupling between lasers, which allows utilizing the storage capacity. This work opens new possibilities for computation with novel analog processors. In addition, the underlying model discussed here suggests an energy-based recurrent network that handles continuous data, as opposed to Hopfield networks and Boltzmann machines, which are intrinsically binary systems.
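To make the storage-and-retrieval idea concrete, the following is a minimal sketch of a generic Kuramoto-type phase-oscillator associative memory, not the laser model used in the paper: phase patterns are embedded as fixed points through a Hebbian-style coupling matrix, and a noisy cue relaxes to the nearest stored pattern. All parameters (N, P, dt, the noise level) are illustrative assumptions, and the coupling here is symmetric, so the nonreciprocal scheme mentioned in the abstract is not implemented.

```python
import numpy as np

# Sketch only (assumed model): Kuramoto-type phase oscillators with
# Hebbian coupling storing phase patterns as stable fixed points.

rng = np.random.default_rng(0)

N = 100        # number of oscillators ("lasers"), illustrative
P = 2          # number of stored phase patterns
dt = 0.05      # Euler integration step
steps = 4000   # number of integration steps

# Binary phase patterns (0 or pi), the continuous-phase analog of +/-1 spins.
patterns = rng.choice([0.0, np.pi], size=(P, N))

# Hebbian-style coupling: J_ij = (1/N) * sum_mu cos(xi_i^mu - xi_j^mu).
# Symmetric (reciprocal) coupling; the paper discusses nonreciprocal coupling
# to improve retrieval, which this sketch does not include.
J = sum(np.cos(patterns[mu][:, None] - patterns[mu][None, :]) for mu in range(P)) / N
np.fill_diagonal(J, 0.0)

def retrieve(theta0):
    """Integrate d(theta_i)/dt = sum_j J_ij * sin(theta_j - theta_i)."""
    theta = theta0.copy()
    for _ in range(steps):
        theta += dt * np.sum(J * np.sin(theta[None, :] - theta[:, None]), axis=1)
    return theta

def overlap(theta, pattern):
    """|m| near 1 means the state matches the pattern up to a global phase."""
    return abs(np.mean(np.exp(1j * (theta - pattern))))

# Excitation condition: stored pattern 0 corrupted by phase noise.
cue = patterns[0] + 0.6 * rng.standard_normal(N)
final = retrieve(cue)
print("overlap with pattern 0:", round(overlap(final, patterns[0]), 3))
print("overlap with pattern 1:", round(overlap(final, patterns[1]), 3))
```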
Similar Resources
Fast Arithmetic Computing with Neural Networks
Neural networks can be viewed as circuits of highly interconnected parallel processing units called ‘neurons’. The most commonly used models of neurons are linear threshold gates or, when continuity or differentiability is required, elements with a sigmoid input-output function. Because of the recent advance in VLSI technology, neural networks have also emerged as a new technology and have found w...
Computing Iterative Roots with Neural Networks
Many real processes are composed of an n-fold repetition of some simpler process. If the whole process can be modelled with a neural network, we present a method to derive a model of the basic process, too, thus performing not only a system identification but also a decomposition into basic blocks. Mathematically this is equivalent to the problem of computing iterative or functional roots: Given ...
Self-assembled networks with neural computing attributes
Two-dimensional arrays of vertical quantum wire Esaki tunnel diodes, laterally connected to their nearest neighbors by resistive/capacitive connections, constitute a powerful and versatile neuromorphic architecture that can function as classical Boolean logic circuits, associative memory, image processors, and combinatorial optimizers. In this paper, we discuss the basic philosophy behind adopt...
Computing with Almost Optimal Size Neural Networks
Artificial neural networks are comprised of an interconnected collection of certain nonlinear devices; examples of commonly used devices include linear threshold elements, sigmoidal elements and radial-basis elements. We employ results from harmonic analysis and the theory of rational approximation to obtain almost tight lower bounds on the size (i.e. number of elements) of neural networks. The...
Computing with dynamic attractors in neural networks.
In this paper we report on some new architectures for neural computation, motivated in part by biological considerations. One of our goals is to demonstrate that it is just as easy for a neural net to compute with arbitrary attractors--oscillatory or chaotic--as with the more usual asymptotically stable fixed points. The advantages (if any) of such architectures are currently being investigated...
Journal
Journal title: Nanophotonics
Year: 2023
ISSN: 2192-8606, 2192-8614
DOI: https://doi.org/10.1515/nanoph-2022-0367